Proximal gradient methods beyond monotony

Authors

Abstract

We address composite optimization problems, which consist in minimizing the sum of a smooth function and a merely lower semicontinuous function, without any convexity assumptions. Numerical solutions to these problems can be obtained by proximal gradient methods, which often rely on a line search procedure as a globalization mechanism. We consider an adaptive nonmonotone scheme based on an averaged merit function and establish asymptotic convergence guarantees under weak assumptions, delivering results on par with the monotone strategy. Global worst-case rates for the iterates' stationarity measure are also derived. Finally, a numerical example indicates the potential of nonmonotonicity for spectral approximations.
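To illustrate the kind of method the abstract describes, here is a minimal sketch of a proximal gradient iteration with a nonmonotone line search driven by an averaged merit function (in the style of Zhang–Hager averaging). This is a hypothetical illustration, not the paper's algorithm: the merit function, acceptance test, and parameter names (`eta`, `delta`, `gamma0`) are assumptions, and the nonsmooth term is fixed to the ℓ1 norm so the proximal map has a closed form.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (closed form for the l1 norm)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def nonmonotone_prox_grad(f, grad_f, lam, x0, gamma0=1.0, eta=0.85,
                          delta=1e-4, max_iter=200, tol=1e-8):
    """Sketch: proximal gradient for F(x) = f(x) + lam*||x||_1 with a
    nonmonotone (averaged-merit) line search.  Parameter choices here are
    illustrative assumptions, not those of the paper."""
    x = x0.copy()
    F = lambda z: f(z) + lam * np.sum(np.abs(z))
    C = F(x)   # averaged merit value C_k
    Q = 1.0    # averaging weight Q_k
    for _ in range(max_iter):
        g = grad_f(x)
        gamma = gamma0
        while True:
            x_new = soft_threshold(x - gamma * g, gamma * lam)
            d = x_new - x
            # Nonmonotone acceptance: compare against the averaged merit C,
            # not against F(x), so occasional increases of F are allowed.
            if F(x_new) <= C - delta / (2 * gamma) * np.dot(d, d) or gamma < 1e-12:
                break
            gamma *= 0.5  # backtracking
        if np.linalg.norm(d) < tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
        # Averaged merit update: C_{k+1} = (eta*Q_k*C_k + F(x_{k+1})) / Q_{k+1}
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + F(x)) / Q_new
        Q = Q_new
    return x
```

With `eta = 0` the scheme reduces to a standard monotone Armijo-type line search on F; larger `eta` makes the merit average longer-memoried and the iteration more tolerant of nonmonotone steps.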


Related resources

Distributed Delayed Proximal Gradient Methods

We analyze distributed optimization algorithms where parts of data and variables are distributed over several machines and synchronization occurs asynchronously. We prove convergence for the general case of a nonconvex objective plus a convex and possibly nonsmooth penalty. We demonstrate two challenging applications, ℓ1-regularized logistic regression and reconstruction ICA, and present experi...


Proximal extrapolated gradient methods for variational inequalities

The paper concerns novel first-order methods for monotone variational inequalities. They use a very simple linesearch procedure that takes into account local information of the operator. Also, the methods do not require Lipschitz continuity of the operator, and the linesearch procedure uses only values of the operator. Moreover, when the operator is affine our linesearch becomes very simp...


Distributed Accelerated Proximal Coordinate Gradient Methods

We develop a general accelerated proximal coordinate descent algorithm in distributed settings (DisAPCG) for the optimization problem that minimizes the sum of two convex functions: the first part f is smooth with a gradient oracle, and the other one Ψ is separable with respect to blocks of coordinates and has a simple known structure (e.g., the L1 norm). Our algorithm gets new accelerated convergen...


Accelerated Proximal Gradient Methods for Nonconvex Programming

Nonconvex and nonsmooth problems have recently received considerable attention in signal/image processing, statistics and machine learning. However, solving the nonconvex and nonsmooth optimization problems remains a big challenge. Accelerated proximal gradient (APG) is an excellent method for convex programming. However, it is still unknown whether the usual APG can ensure the convergence to a...



Journal

Journal title: Journal of nonsmooth analysis and optimization

Year: 2023

ISSN: 2700-7448

DOI: https://doi.org/10.46298/jnsao-2023-10290